Quartet-Based Learning of Shallow Latent Variables

Authors

  • Tao Chen
  • Nevin Lianwen Zhang
Abstract

Hierarchical latent class (HLC) models are tree-structured Bayesian networks where leaf nodes are observed while internal nodes are hidden. We explore the following two-stage approach for learning HLC models: one first identifies the shallow latent variables – latent variables adjacent to observed variables – and then determines the structure among the shallow and possibly some other “deep” latent variables. This paper is concerned with the first stage. In earlier work, we have shown how shallow latent variables can be correctly identified from quartet submodels if one could learn them without errors. In reality, one does make errors when learning quartet submodels. In this paper, we study the probability of such errors and propose a method that can reliably identify shallow latent variables despite the errors.
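The general idea of tolerating errors in individual quartet submodels can be illustrated with a small voting sketch. The Python snippet below is not the authors' actual procedure; it only shows one plausible way to aggregate noisy quartet information. The `quartet_topology` learner, its return format, and the `n_votes` parameter are hypothetical assumptions made for illustration.

```python
from collections import Counter
from itertools import combinations

def are_siblings(x, y, observed, quartet_topology, n_votes=20):
    """Decide whether observed variables x and y appear to share a shallow
    latent parent, by majority vote over quartet submodels.

    quartet_topology(a, b, c, d) is a hypothetical learner that returns the
    sibling partition of the quartet, e.g.
    frozenset({frozenset({a, b}), frozenset({c, d})}).
    Any single call may be wrong; voting over many quartets makes the
    overall decision more robust to such errors.
    """
    others = [v for v in observed if v not in (x, y)]
    votes = Counter()
    # Sample up to n_votes quartets containing both x and y.
    for c, d in list(combinations(others, 2))[:n_votes]:
        partition = quartet_topology(x, y, c, d)
        # Record whether x and y were grouped together in this quartet.
        votes[frozenset({x, y}) in partition] += 1
    return votes[True] > votes[False]
```

Under this sketch, a pair of observed variables is declared to share a shallow latent parent only if the majority of sampled quartets agrees, so an occasional misestimated quartet submodel does not change the final decision.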


Similar Articles

Quartet-Based Learning of Hierarchical Latent Class Models: Discovery of Shallow Latent Variables

Hierarchical latent class (HLC) models are tree-structured Bayesian networks where leaf nodes are observed while internal nodes are hidden. The currently most efficient algorithm for learning HLC models can deal with only a few dozen observed variables. While this is sufficient for some applications, more efficient algorithms are needed for domains with, e.g., hundreds of observed variables. Wi...


Discovering Hidden Variables in Noisy-Or Networks using Quartet Tests

We give a polynomial-time algorithm for provably learning the structure and parameters of bipartite noisy-or Bayesian networks of binary variables where the top layer is completely hidden. Unsupervised learning of these models is a form of discrete factor analysis, enabling the discovery of hidden variables and their causal relationships with observed data. We obtain an efficient learning algor...


Unfolding Latent Tree Structures using 4th Order Tensors

Discovering the latent structure from many observed variables is an important yet challenging learning task. Existing approaches for discovering latent structures often require the unknown number of hidden states as an input. In this paper, we propose a quartet based approach which is agnostic to this number. The key contribution is a novel rank characterization of the tensor associated with th...


Spectral Methods for Learning Multivariate Latent Tree Structure

This work considers the problem of learning the structure of multivariate linear tree models, which include a variety of directed tree graphical models with continuous, discrete, and mixed latent variables such as linear-Gaussian models, hidden Markov models, Gaussian mixture models, and Markov evolutionary trees. The setting is one where we only have samples from certain observe...


How to Train Deep Variational Autoencoders and Probabilistic Ladder Networks

Variational autoencoders are a powerful framework for unsupervised learning. However, previous work has been restricted to shallow models with one or two layers of fully factorized stochastic latent variables, limiting the flexibility of the latent representation. We propose three advances in training algorithms of variational autoencoders, for the first time allowing to train deep models of up...



Journal:

Volume:   Issue:

Pages:  -

Publication date: 2006